# fs-hdfs3

It is based on version 0.0.4 of [hdfs-rs](http://hyunsik.github.io/hdfs-rs) and provides the libhdfs binding library along with Rust APIs that safely wrap the libhdfs binding APIs.
## Current Status

- All libhdfs FFI APIs are ported.
- Safe Rust wrapper APIs cover most of the libhdfs APIs, except those related to zero-copy reads.
- Compared to hdfs-rs, the lifetime parameter on `HdfsFs` is removed, which makes the crate easier for others to depend on.
## Documentation

- [API documentation](https://docs.rs/crate/fs-hdfs3)
## Requirements

- The C-related files are taken from branch `3.1.4` of the Hadoop repository, with a few changes applied for Rust usage.
- There is no need to compile the Hadoop native library yourself. However, the Hadoop jar dependencies are still required.
## Usage

Add this to your `Cargo.toml`:

```toml
[dependencies]
fs-hdfs3 = "0.1.12"
```
## Build

We need to specify `$JAVA_HOME` to make the Java shared library available for building.
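For example (a sketch; the JDK location is illustrative and depends on your system):

```shell
# Illustrative path: replace with the location of your own JDK installation.
export JAVA_HOME=/usr/lib/jvm/java-11-openjdk
cargo build
```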
## Run

Since our compiled libhdfs is a JNI-based implementation, it requires the Hadoop-related classes to be available through `CLASSPATH`. An example:
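A sketch, assuming a standard Hadoop installation (the `HADOOP_HOME` path below is illustrative):

```shell
# Illustrative path: point HADOOP_HOME at your own Hadoop installation.
export HADOOP_HOME=/opt/hadoop-3.1.4
export CLASSPATH=$($HADOOP_HOME/bin/hadoop classpath --glob)
```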
Also, we need to specify the JVM dynamic library path so that the application can load the JVM shared library at runtime. Typical locations are shown below; the exact path depends on your JDK layout.
For jdk8 and macOS, it's
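```shell
export DYLD_LIBRARY_PATH=$JAVA_HOME/jre/lib/server
```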
For jdk11 (or later JDKs) and macOS, it's
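```shell
export DYLD_LIBRARY_PATH=$JAVA_HOME/lib/server
```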
For jdk8 and CentOS, it's
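```shell
# Assumes an x86_64 JDK; the arch directory may differ on other platforms.
export LD_LIBRARY_PATH=$JAVA_HOME/jre/lib/amd64/server
```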
For jdk11 (or later JDKs) and CentOS, it's
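```shell
export LD_LIBRARY_PATH=$JAVA_HOME/lib/server
```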
## Testing

The tests also require `CLASSPATH` and `DYLD_LIBRARY_PATH` (or `LD_LIBRARY_PATH`) to be set as above. If the Java class `org.junit.Assert` can't be found, refine the `$CLASSPATH` as follows:
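A sketch, assuming a JUnit jar is bundled with the Hadoop distribution (the exact location varies across versions):

```shell
# Assumption: a junit-*.jar ships under $HADOOP_HOME/share/hadoop;
# adjust the search path if your distribution keeps it elsewhere.
export CLASSPATH=$CLASSPATH:$(find $HADOOP_HOME/share/hadoop -name "junit-*.jar" | head -n 1)
```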
Here, `$HADOOP_HOME` needs to be specified and exported.
Then you can run the tests:
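```shell
cargo test
```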
## Example

```rust
use std::sync::Arc;
use hdfs::hdfs::{get_hdfs_by_full_path, HdfsFs};

// The namenode URL and directory path below are illustrative.
let fs: Arc<HdfsFs> = get_hdfs_by_full_path("hdfs://localhost:8020").ok().unwrap();
match fs.mkdir("/data") {
    Ok(_) => println!("/data has been created"),
    Err(_) => panic!("/data creation has failed"),
};
```